Ferromagnetic resonance (FMR) is a widely used dynamical measurement for characterizing a broad range of magnetic materials. Applied research and development on magnetic thin-film materials is growing rapidly alongside a growing commercial appetite for magnetic memory and computing technologies. High-quality, fast FMR surveys of magnetic thin films are needed to meet the throughput demands of rapid materials exploration and quality control. Here, we implement the optimal Bayesian experimental design software developed by McMichael et al. [J. Res. Natl. Inst. Stand. Technol. 126, 126002 (2021)] in a vector network analyzer FMR setup to demonstrate an unexplored opportunity to accelerate FMR measurements. We make a systematic comparison between the optimal Bayesian measurement and the conventional measurement. For the same measurement duration, the Bayesian implementation reduces the uncertainties in the linewidth and resonance frequency by 40% to 60%. In practical terms, this approach reaches a target uncertainty of ±5 MHz for the linewidth and ±1 MHz for the resonance frequency in 2.5× less time than the conventional approach. Because the optimal Bayesian approach decreases only random errors, we evaluate how large systematic errors may limit its full advantage. This approach can deliver gains in measurement speed by a factor of 3 or more and, as a software add-on, can be integrated with any FMR measurement system to accelerate materials discovery and quality control measurements alike.
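The core idea behind optimal Bayesian experimental design, as implemented in software like the package cited above, is to maintain a posterior over the model parameters and measure next wherever plausible models disagree most. The following is a minimal self-contained sketch of that loop for a Lorentzian FMR line; all numbers (resonance at 4.73 GHz, 0.21 GHz linewidth, noise level, grid ranges) are illustrative assumptions, not values from the paper, and the code does not use the actual NIST package.

```python
import numpy as np

rng = np.random.default_rng(0)

def lorentzian(f, f0, dw, amp=1.0):
    # Symmetric absorption line: center f0, full width dw (frequencies in GHz)
    return amp * (dw / 2) ** 2 / ((f - f0) ** 2 + (dw / 2) ** 2)

# Hypothetical "true" resonance that the simulated instrument measures
F0_TRUE, DW_TRUE, NOISE = 4.73, 0.21, 0.05

# Grid posterior over (resonance frequency, linewidth)
f0s = np.linspace(4.0, 5.5, 151)
dws = np.linspace(0.05, 0.5, 91)
F0, DW = np.meshgrid(f0s, dws, indexing="ij")
log_post = np.zeros_like(F0)          # flat prior

freqs = np.linspace(4.0, 5.5, 300)    # allowed measurement settings

for step in range(40):
    post = np.exp(log_post - log_post.max())
    post /= post.sum()
    # Utility: posterior variance of the predicted signal at each setting,
    # i.e. measure where the plausible models disagree the most
    preds = lorentzian(freqs[:, None, None], F0[None], DW[None])
    mean = (preds * post[None]).sum(axis=(1, 2))
    var = (preds ** 2 * post[None]).sum(axis=(1, 2)) - mean ** 2
    f_next = freqs[np.argmax(var)]
    # Simulated noisy measurement at the chosen frequency
    y = lorentzian(f_next, F0_TRUE, DW_TRUE) + rng.normal(0, NOISE)
    # Bayesian update with a Gaussian likelihood
    log_post += -0.5 * ((y - lorentzian(f_next, F0, DW)) / NOISE) ** 2

post = np.exp(log_post - log_post.max())
post /= post.sum()
f0_est = (F0 * post).sum()
dw_est = (DW * post).sum()
print(f"f0 ≈ {f0_est:.3f} GHz, linewidth ≈ {dw_est:.3f} GHz")
```

The speedup reported in the abstract comes from this adaptive placement: points cluster where they are most informative about f0 and the linewidth, instead of being swept uniformly across the band.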
-
Increased use of technology in schools raises new privacy and security challenges for K-12 students, including harms such as commercialization of student data, exposure of student data in security breaches, and expanded tracking of students, but the extent of these challenges is unclear. In this paper, we first interviewed 18 school officials and IT personnel to understand which educational technologies districts use and how they manage student privacy and security around these technologies. Second, to determine whether these educational technologies are widely endorsed across United States (US) public schools, we compiled a list of linked educational technology websites scraped from 15,573 K-12 public school and district domains and analyzed them for privacy risks. Our findings suggest that administrators lack the resources to properly assess the privacy and security issues around educational technologies, even though these technologies do pose potential privacy risks. Based on these findings, we make recommendations for policymakers, educators, and the CHI research community.
-
Gamma-ray binaries, composed of a compact object orbiting a massive companion star, are luminous in gamma rays. The interaction between these two objects can drive relativistic outflows, either jets or winds, in which particles can be accelerated to energies reaching hundreds of teraelectronvolts (TeV). However, it is still debated where and under which physical conditions particles are accelerated in these objects, and ultimately whether protons can be accelerated up to PeV energies. Among the well-known gamma-ray binaries, LS 5039 is a high-mass X-ray binary with an orbital period of 3.9 days that has been observed up to TeV energies by the High Energy Stereoscopic System. We present new observations of LS 5039 obtained with the High Altitude Water Cherenkov (HAWC) observatory. Our data reveal that the gamma-ray spectrum of LS 5039 extends up to 200 TeV with no apparent spectral cutoff. Furthermore, we confirm, with a confidence level of 4.7σ, that the emission between 2 and 118 TeV is modulated by the orbital motion of the system, and find a 2.2σ hint of variability above 100 TeV. This indicates that these photons are likely produced within or near the binary orbit, where they can undergo absorption by the stellar photons. In a leptonic scenario, the highest-energy photons detected by HAWC can be emitted by ∼200 TeV electrons inverse Compton scattering stellar photons, which would require an extremely efficient acceleration mechanism operating within LS 5039. Alternatively, a hadronic scenario could explain the data through proton-proton or proton-gamma collisions of protons accelerated to petaelectronvolt energies.
Free, publicly accessible full text available July 10, 2026.
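Why ∼200 TeV photons imply ∼200 TeV electrons can be checked with a one-line estimate: when the Klein-Nishina parameter b = 4·E_e·ε/(m_e c²)² greatly exceeds 1, a single inverse Compton scattering transfers nearly the entire electron energy to the photon. A quick sketch, using an assumed typical stellar photon energy of ~10 eV for a hot O-type companion (an illustrative value, not taken from the paper):

```python
# Deep Klein-Nishina check for inverse Compton scattering in LS 5039.
E_e = 200e12    # electron energy, eV (scale inferred from the detected photons)
eps = 10.0      # typical stellar photon energy, eV (assumed for an O star)
mec2 = 0.511e6  # electron rest energy, eV

b = 4 * E_e * eps / mec2**2
print(f"Klein-Nishina parameter b = {b:.1e}")
# b >> 1: deep Klein-Nishina regime, so the scattered photon carries
# almost the full electron energy and E_gamma ~ E_e ~ 200 TeV.
```

This is why the leptonic interpretation requires electrons nearly as energetic as the observed photons, and hence an extremely efficient accelerator.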
-
Surface performance is critically influenced by topography in virtually all real-world applications. The current standard practice is to describe topography using one of a few industry-standard parameters. The most commonly reported number is Ra, the average absolute deviation of the height from the mean line (at some lateral length scale that is not necessarily known or specified). However, other parameters, particularly scale-dependent ones, influence surface and interfacial properties; for example, the local surface slope is critical for visual appearance, friction, and wear. The present Surface-Topography Challenge was launched to raise awareness of the need for a multi-scale description, and also to assess the reliability of different metrology techniques. In the resulting international collaborative effort, 153 scientists and engineers from 64 research groups and companies across 20 countries characterized statistically equivalent samples from two different surfaces: a "rough" and a "smooth" surface. The results of the 2088 measurements constitute the most comprehensive surface description ever compiled. We find wide disagreement across measurements and techniques when the lateral scale of the measurement is ignored. Consensus is established through scale-dependent parameters, after removing data that violate an established resolution criterion or deviate from the majority of measurements at each length scale. Our findings suggest best practices for characterizing and specifying topography. The public release of the accumulated data and the presented analyses enables global reuse for further scientific investigation and benchmarking.
Free, publicly accessible full text available September 1, 2026.
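The distinction the abstract draws, between a single scale-blind number like Ra and a scale-dependent parameter like the RMS slope, can be made concrete on a synthetic profile. The sketch below uses an invented profile (50 nm sinusoidal waviness plus 5 nm fine-scale noise, 1 µm sampling); the numbers are illustrative only and have no connection to the challenge samples:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic 1-D height profile: 10 mm long, 1 um sampling (illustrative)
dx = 1e-6                                   # lateral sampling, m
x = np.arange(10_000) * dx
h = (50e-9 * np.sin(2 * np.pi * x / 1e-3)   # 50 nm waviness, 1 mm period
     + 5e-9 * rng.standard_normal(x.size))  # 5 nm fine-scale roughness

# Ra: mean absolute deviation from the mean line, a single scale-blind number
ra = np.mean(np.abs(h - h.mean()))
print(f"Ra = {ra * 1e9:.1f} nm")

# RMS slope evaluated at different lateral scales via finite differences:
# the answer depends strongly on the scale chosen, unlike Ra
slopes = {}
for n in (1, 10, 100, 1000):
    scale = n * dx
    slope = (h[n:] - h[:-n]) / scale
    slopes[n] = np.sqrt(np.mean(slope ** 2))
    print(f"scale {scale * 1e6:7.1f} um: rms slope = {slopes[n]:.2e}")
```

On this profile the fine-scale noise dominates the slope at the 1 µm scale while the millimeter-scale slope is orders of magnitude smaller, which is exactly why two instruments with different lateral resolutions can report wildly different "roughness" for the same surface unless the measurement scale is stated.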